Frictionless design, frictionless racism
Is the shift from frictionless design to meaningful friction combating racial bias amongst users?
In recent years, the tech industry has become obsessed with eliminating friction. Experiences that allow users to achieve their goals as quickly and easily as possible are increasingly desirable; a marker of success in an industry preoccupied with convenience and optimisation.
This approach, referred to as ‘frictionless design’, is seen across many of the digital products available to users today. Renting accommodation, calling a taxi, sharing an opinion with like-minded friends; the rise of near-frictionless platforms, such as Airbnb and Uber, means it has never been easier to complete tasks that were once difficult and time-consuming.
However, recent events, such as the continuing spike in online discrimination following the election of US President Donald Trump and the Black Lives Matter Movement, have called frictionless experiences into question. Seemingly ever-present instances of racial discrimination across the digital landscape have begun to reveal the role of frictionless design in the reinforcement and perpetuation of racial bias amongst users.
With rising instances of racism and polarisation online, designers are beginning to call for the reintroduction of friction in design to challenge racial biases. And some companies are starting to listen.
What Is Frictionless Design?
Defining Frictionless Design
In User Experience, friction is defined as “interactions that inhibit people from intuitively and painlessly achieving their goals within a digital interface.”¹ This may include, for example, modals or pop-ups, which interrupt the user’s journey and slow their progress. Put simply, friction sits between the user and their goal. As a result, eliminating friction has become a focus of the tech industry, creating a culture of optimisation² and giving rise to the principle of frictionless design.
Frictionless design produces experiences that answer “a user’s wants and needs as quickly and seamlessly as possible.”³ The frictionless ethos is perhaps best captured in Steve Krug’s influential book ‘Don’t Make Me Think’⁴, which outlines usability best practice. Krug’s dominant principle of usability, and one core to frictionless design, is “don’t make me think”: an experience should require minimal cognitive effort from users. In other words, a truly frictionless experience, and its constituent interactions, should take little to no thought to understand. Such experiences should also enable users to complete tasks as quickly as possible.
The elimination of friction is also largely supported within wider usability best practice, which prioritises “ease of interaction between the user and the device or application.”⁵ For example, in his seminal book ‘The Design of Everyday Things’⁶, Don Norman puts forward his ‘Seven Principles for Transforming Difficult Tasks into Simple Ones’. His second principle, “simplifying the structure of tasks”, outlines the crux of frictionless design. Norman suggests that tasks should be simple in order to minimise the amount of problem-solving required, as users have limited attention they’re willing to commit to tasks. Therefore, experiences should cater to this by minimising interruption, or rather, friction.
In short, frictionless design generates experiences that are simple, convenient, and fast, allowing users to accomplish tasks with minimal cognition and effort.
Popularised by Krug and Norman’s writings, frictionless design has been widely adopted within the tech industry. From Facebook’s ‘Frictionless Sharing’ model to Uber’s self-proclaimed ‘seamless pickups’, an array of companies boast frictionless-led products and have integrated the frictionless ethos into their approach to User Experience (UX). But how did this convenience-focused approach rise to dominance? And why did it emerge?
How Did Frictionless Design Become The Standard?
“A seamless, frictionless user experience has now become the new standard: the minimum to be relevant in the fast-paced world of technology”¹ is the crucial point made by Victoria Young in her examination of strategic UX. Kevin Roose corroborates this view within his article ‘Is Tech Too Easy To Use?’, in which he observes that within the last decade, eliminating friction “has become an obsession of the tech industry, accepted as gospel by many of the world’s largest companies.”² Indeed, amongst articles, user feedback, books, and the like, a growing consensus is that a frictionless, convenient user experience is necessary for a product to succeed. The popularisation of frictionless design can be directly linked to two crucial factors: a highly competitive product market and rising user expectations.
A Highly Competitive Product Market
In his ‘Simplicity Thesis’, Aaron Levie notes that if a product is any more complex than necessary, or slows users down, it will be overtaken by a newer, more streamlined competitor and risk extinction.⁷ Companies therefore seek to eliminate friction within a product to increase and maintain user engagement⁸ and growth, providing a significant advantage over direct competitors.
Airbnb, which simplified the process of listing and booking accommodation, is a key example of frictionless-led success. A study conducted by analytics company 7Park Data found that between 2009 and 2016, Airbnb’s listings grew from 3,000 to 2.3 million — a compound annual growth rate of 153%.⁹ Not only did Airbnb overtake the listings of its closest competitor, HomeAway, but with a presence in 191 countries, it has surpassed hospitality giants Marriott and Hilton.¹⁰
Rising User Expectations
Airbnb isn’t an outlier. Other digital platforms, such as Uber and Facebook, have seen significant success and profit by adopting a frictionless approach. And as more and more products deliver frictionless experiences, this is setting a new standard for users.
Users expect products to work without disruption and aren’t willing to accept inconvenience or invest additional time to accomplish tasks.¹¹ Indeed, Facebook IQ’s ‘Zero Friction Future’¹² report found that, among users, there is a rapidly rising demand for convenient experiences.
Studies appear to support this view. Christiane Lemieux and Duff McDonald’s book ‘Frictionless’¹¹ examines how startups leverage the ‘frictionless’ concept to overtake competitors. They observed that users who experience pain points within a product are likely to switch to a competitor offering a more effortless, ‘hassle-free’ experience. Lemieux and McDonald make the critical observation that creating a good customer experience is essential, as users are accustomed to having their needs met almost instantly. And because users are demanding convenience, it can be suggested that frictionless design has become a mandate rather than a preference.
Both factors discussed above feed into and perpetuate one another. As products become more effortless and convenience-driven, user expectations rise while their tolerance for slower experiences drops. This cycle has occurred with minimal consideration from companies, or the tech industry at large, into what repercussions may occur. What effect, outside of higher expectations, are these products having on users? How is the continual minimisation of friction impacting their wider behaviour and psychology?
The Consequences Of Convenience
As previously illustrated, frictionless design prioritises speed, simplicity, and fast decision-making. However, according to Jennifer Eberhardt, a social psychologist at Stanford University, these core elements provide the perfect conditions for bias (prejudice in favour of or against a person or group of people¹³) to flourish in users.¹⁴ Indeed, racial bias has seen a significant re-emergence in online communities. In 2015, an academic study found that merely having access to broadband corresponded to increased racial hate crimes, such as harassment and violence.¹⁵ But how does frictionless design enable racial biases to get to this point?
Effortless Experiences & Implicit Bias Affirmation
The most pervasive form of bias is implicit bias. Implicit bias, otherwise known as unconscious bias or implicit social cognition, is defined as the “attitudes and stereotypes that affect our behaviours, our decisions, and our attitudes unconsciously”.¹⁶ For example, a user may have an automatic preference for one race over another and be completely unaware of it.
Implicit bias is present in all users and may even oppose their conscious beliefs without their awareness.¹⁶ This presents issues when users interact with frictionless interfaces. When a user engages with an interface designed to be frictionless and fast, the result can be mindless interactions driven by implicit bias.⁸ This provides little opportunity for users to examine, or even be aware of, their behaviour. Therefore, it can be proposed that the minimal cognition required by frictionless design allows users’ implicit racial biases to go unquestioned, enabling racist behaviour driven by invisible stereotypes. Left unexamined, these biases validate themselves.¹⁷ Furthermore, without confrontation, racial biases can surface more frequently in the user’s behaviour and manifest as microaggressions.
A microaggression is “a subtle, automatic, and often nonverbal [action] that communicates hostile, derogatory, or negative prejudicial slights and insults”¹⁷ towards a group. An example of microaggressions is the implicit racial bias perpetuated by Airbnb hosts.
When a user applies to rent accommodation, Airbnb allows the ‘host’ — the individual who owns the property — to accept or reject their booking after reviewing their application information, which typically consists of the user’s name and a photograph. However, according to researchers at Harvard Business School, this design choice enables racial discrimination. Their study found that guests with distinctively African-American names are 16% less likely to be accepted to rent a property than guests with distinctively ‘white’ names.¹⁹ These racial disparities were seen in every type of host across every city studied.
According to Laura Murphy, an African American civil rights attorney hired by Airbnb to address their racism problem, the biggest challenge, and the core of the issue, was unconscious bias.¹⁷ Hosts can review a name, activating implicit racial bias, and reject a guest with little effort. It can be argued that this frictionless experience does not provide sufficient conditions for hosts to properly examine why they are rejecting someone, enabling implicit racial bias to take over the decision-making process. This serves to nurture implicit biases¹⁷ and allows them to continue manifesting in user behaviour.
Implicit Racial Bias Becomes Explicit Racial Bias
As previously illustrated, racial bias and its consequent behaviours are more likely to flare up when users’ decisions are left unquestioned and unmonitored. Furthermore, researchers at Harvard Business School have also found that implicit biases can, and do, worsen over time.²⁰ This means users’ implicit racial biases could potentially develop into explicit racial biases. Explicit biases are the “attitudes and beliefs we have about a person or group on a conscious level”.²¹ Unlike implicit bias, expressions of explicit bias, including racial discrimination and hate speech, occur through intentional thought and are controllable. This change from implicit to explicit is intensified by several factors, including the information users are exposed to.²¹
The stream of information across digital platforms is not organised in a balanced way. Exacerbated by frictionless design, information is increasingly organised by algorithms that seek to maximise platform usage²² through cultivating convenient, frictionless, and engaging experiences for users. Algorithms achieve this by personalising content displayed to users based on their browsing history, gender, location, age, and other data.²³ This data informs, for example, the posts selected to appear in Facebook’s Newsfeed or the top search results in Google; little effort or cognition is required to locate enjoyable, individualised content. However, this results in a worldview catered to fit users’ individual preferences and biases, known as a ‘filter bubble’ of content.²⁴ This is amplifying and emboldening implicit racial biases to the point that they become explicit.
A key example is Facebook’s News Feed. When a user, driven by implicit racial bias, engages with racially-charged content (liking, sharing, commenting), Facebook’s recommendation algorithm retrieves this data and is trained to recommend similar content.²⁵ As a result, the user sees more and more content supporting their implicit racial biases, gradually separating them into a racist filter bubble. This information segregation removes contradictory viewpoints and users from view, trapping the user in an epistemic bubble²⁶: a News Feed where everyone the user sees shares the same racist opinions. Containment within an epistemic bubble can make users more conscious of their implicit racial biases and, by solely seeing racist content, assume these perspectives are the norm.²³ This shifting of norms allows implicit bias to emerge explicitly.
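The feedback loop described above can be reduced to a few lines. The sketch below is a toy illustration of engagement-driven ranking, not Facebook’s actual system; the catalogue, topics, and function names are invented for the example. The point it demonstrates is structural: when past engagement weights future ranking, the feed narrows toward whatever the user already engages with.

```python
from collections import Counter

# A toy catalogue of items the platform could surface, each tagged with a topic.
catalogue = [
    {"id": 1, "topic": "sport"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "music"},
]

def recommend(engagement_history, catalogue, k=2):
    """Rank items by how often the user has engaged with each topic."""
    topic_weights = Counter(engagement_history)  # unseen topics weigh zero
    ranked = sorted(catalogue,
                    key=lambda item: topic_weights[item["topic"]],
                    reverse=True)
    return ranked[:k]

# A user whose past engagement skews towards one topic...
history = ["politics", "politics", "sport"]
feed = recommend(history, catalogue)
# ...is served still more of that topic, narrowing the feed further.
```

Each round of engagement with the recommended items would further skew `history`, which is the self-reinforcing loop that produces a filter bubble.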
Through filter and epistemic bubbles, platforms create a safe, frictionless space for overt displays of racism. According to hate group experts, the number of white supremacists is rising as social media connects like-minded users¹⁶ and creates a space where users know they will face no confrontation, or social friction, for sharing racially-charged content.
Empowering Racial Bias: Google & Dylann Roof
Despite being a source of user engagement and convenience, frictionless design has been illustrated to reinforce bias and embolden explicit, racist behaviour amongst users. Unfortunately, it is also generating consequences outside of the digital landscape.
While searching for information about Trayvon Martin, an African American teenager shot and killed by a neighbourhood watch volunteer, Dylann Roof encountered the phrase ‘black on white crime’.²⁷ Googling the phrase led Roof to the website of the largest white supremacist group in the United States²⁸, the Council of Conservative Citizens (CCC), which posed as a legitimate conservative news site. He continued to scour the CCC’s site, and his subsequent searches “led him deeper and deeper into the world of online hate groups and propaganda”²⁷, including to the false claim that Black violence against White Americans was at crisis level.²⁹ With his filter bubble significantly narrowed and his explicitly racist beliefs firmly established, Roof declared himself racially aware³⁰ in an online manifesto. On June 17th 2015, he entered the Emanuel African Methodist Episcopal Church and shot dead nine African American worshippers.
Dylann Roof’s racial radicalisation cannot be attributed to one factor; everything from Google’s algorithm to Roof’s psychology played a role. However, it can be argued that Google’s frictionless design aided the development and empowerment of Roof’s explicit racial bias. According to sociologist Miriam E. Sweeney, Google’s “simple, sparse design works to obscure the interface’s complexity, making the results appear purely scientific and data-driven”.³¹ In other words, the lack of friction and simplification of Google’s interface helps blur objective sources and makes it challenging to identify racist, paid search results — which rank higher in the search engine. Furthermore, the dominant perception that top search results are both popular and objective allows users to believe racist search results mirror collective norms and beliefs. This normalisation of racist content provides the empowerment needed for implicit racial biases to be expressed explicitly and confidently.
The Rise Of Meaningful Friction
As evidenced previously, frictionless design creates a significant issue in the affirmation and empowerment of implicit and explicit racial biases. This raises the question: how can design now begin to address, challenge and reduce these biases in users?
What Is Meaningful Friction?
There is a rising tide of designers within the tech industry advocating the utility of friction in design. Upon interviewing numerous designers, product managers, and tech executives, Kevin Roose found the majority opinion was that reintroducing friction would help foster healthier, more tolerant behaviour in users.² Indeed, entering ‘frictionless design’ into a search engine now seemingly returns as many results advocating the reintroduction of design friction as those calling for its elimination. This call to action has intensified in the wake of racist injustices and incidents suffered by minorities as a result of frictionless design. A recent example is the ease and speed with which racist, anti-Chinese content spread across digital platforms as Covid-19 entered the public discourse.³²
It is important to note that, in this context, friction holds a different meaning than in the traditional design sense. Designers are not advocating the addition of friction that would add unnecessary hurdles, cause frustration, and generally hinder a user’s experience.
Instead, there is a growing movement behind the concept of meaningful friction — friction that has been intentionally designed and introduced into user interactions with care.⁸ Such friction points can elicit reflective, informed, and positive interactions⁸ and support positive behaviour change in users. For example, when given the time to reflect, a user may recognise racist impulses and alter their approach to an interaction. According to Bill Gribbons, Professor of Information Design and Corporate Communication at Bentley University, the shift towards introducing purposefully-designed friction is key in addressing users’ implicit biases.³³
This isn’t to say that established principles should be discounted. In general, the usability principles championed by Don Norman and Steve Krug are useful in creating effective user experiences. However, meaningful friction should be employed in contexts that have heightened criticality or sensitivity. For example, to prevent and challenge instances of racial bias perpetuated by users, as seen in the previous section. But how does purposefully-designed friction combat racial bias in users? And how does it elicit positive behaviour change?
Two Systems Of Thinking
Cognitive psychologists posit that people have two systems of thought: System 1 and System 2.³⁴ Each system has its own distinct characteristics that influence user behaviour. System 1 thinking is automatic, involuntary, and very fast in delivering information to the user; this is where implicit racial biases reside. Conversely, System 2 is characterised by slower, more deliberate thought. A significant function of System 2 is to monitor and control the thoughts, biases and actions “suggested” by System 1, enabling some to be expressed in behaviour and others to be modified or suppressed.³⁴
Numerous factors determine whether System 1 or System 2 thinking is used, including the complexity of an interaction.⁸ Because frictionless design generates simple, fast interactions that require low cognition and effort, System 1 thinking is typically engaged. This can result in mindless interactions completed on ‘autopilot’³⁴, as users instinctively and automatically react to an interface. As previously illustrated, this low-effort user engagement can affirm and empower users to act on implicit racial bias.
However, meaningful friction works to challenge racial biases in users by triggering more deliberate, conscious thought. The addition of friction slows users down and creates a level of cognitive strain³⁴, in place of the minimal cognition required by frictionless design. Cognitive strain, and the resulting increase in the effort required to complete a task, mobilises System 2 and allows users to approach interactions from a more deliberate, logical perspective. This ensures that users consciously consider their actions and have time to reflect on possible outcomes, rather than instinctively acting on implicit racial biases. Furthermore, engaging System 2, with its inherent scepticism of intuition³⁴, increases the likelihood that racial biases instinctively endorsed by System 1 will be modified or suppressed before they can be acted out as behaviour.
An example of how designed friction can result in more reflective, conscious behaviour is provided by Hedeen et al.³⁵ In their experiment, participants were required to complete a short test, at the end of which they encountered design friction: a modal that reminded them to check their answers. Before participants interacted with the design friction, questions were answered correctly less than 50% of the time, suggesting System 1 was initially in use. After the design friction was introduced, System 2 was engaged, and the median number of correct responses per participant increased for every question. These results indicate that design friction encourages System 2 thinking over System 1, increasing user attention and accuracy when completing a task.
Therefore, it can be argued that the inclusion of design friction could facilitate deeper levels of involvement in users and encourage them to reflect on their racially-driven behaviour. Furthermore, this reflection increases the likelihood of positive behaviour change in users as they engage with a platform.
Microboundaries & Value-Led Behaviour
Hedeen et al.’s study demonstrated that design friction creates reflective interactions and encourages positive behaviour change. This is further supported by the research of Cox et al.⁸, who argue that mindful interactions can be achieved via small, single moments of friction in interaction, which they describe as microboundaries.
A microboundary is “an intervention that provides a small obstacle prior to an interaction that prevents us rushing from one context to another.”⁸ It achieves this by creating a small moment where users can reflect on, and become more mindful of their behaviour. The short pause in an interaction prompts a shift from System 1 to the more logical, System 2 driven behaviour. This provides users the opportunity to recognise racist behaviour, or at minimum, recognise negative behaviour unknowingly driven by implicit racial bias. As previously illustrated, this shift encourages positive behaviour change. In addition to this, according to Cox et al., microboundaries also support value-led behaviour.
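A microboundary of this kind can be sketched as a thin wrapper around any action. The sketch below is an illustration of Cox et al.’s concept under assumed names (`with_microboundary`, `confirm`), not any platform’s real implementation: the `confirm` callable stands in for the UI prompt that forces the pause.

```python
def with_microboundary(action, confirm):
    """Wrap `action` behind a small obstacle that forces a pause.

    `confirm` models the prompt: it receives the same arguments as the
    action and returns True only if, after pausing, the user still
    wants to proceed.
    """
    def guarded(*args, **kwargs):
        if confirm(*args, **kwargs):
            return action(*args, **kwargs)
        return None  # the user reflected and chose not to proceed
    return guarded

# Simulate two attempts to post: the user reconsiders the first attempt
# at the boundary and confirms the second.
decisions = iter([False, True])
sent = []
post = with_microboundary(sent.append, lambda text: next(decisions))
post("impulsive reply")   # blocked at the microboundary
post("considered reply")  # confirmed and sent
```

The design choice worth noting is that the obstacle sits *before* the action rather than after it: the shift from System 1 to System 2 has to happen while the behaviour can still be modified.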
In this instance, value-led behaviour refers to behaviour that aligns with a user’s conscious beliefs and intent. However, as outlined in Section 2, a user could act on implicit racial biases while also consciously holding anti-racist values.¹⁶ This clash has been examined by social psychologists Mahzarin Banaji and Anthony Greenwald, who created the Implicit Association Test (IAT). The IAT measures the relationship between System 1 and System 2 with regard to racial perception. In analysing participant responses, Banaji and Greenwald found a conflict between System 1 and System 2 in how race was perceived.³⁶ Typically, participants would self-report that they did not have any racial biases. Nevertheless, test results revealed that implicit racial biases covertly guided participant behaviour and social cognition.
Therefore, it can be proposed that microboundaries, and meaningful friction as a whole, could challenge racial bias through facilitating behaviour that aligns with users’ positive, conscious beliefs about race. This value-led behaviour is achieved by shifting to System 2 thinking, which suppresses intuitive, racially-biased behaviours and enables users to have more conscious, tolerant involvement in an interaction.
The Impact Of Meaningful Friction
The positive implications of introducing meaningful friction are clear, and some companies have taken encouraging steps in designing experiences that challenge racial bias. But how effective has this been in reducing racial bias in users? And is the wider tech industry open to the reintroduction of design friction?
ReThink
Research has found that adolescents aged 12 to 18 are 40% more likely to post offensive content online than any other age group.³⁷ This is because the brain’s prefrontal cortex, which deals with decision-making, isn’t fully developed until age 25.³⁷ As a result, the adolescent brain significantly utilises System 1 thinking, making high-speed decisions without pausing to reflect. As previously illustrated, this can result in racist content being published and shared online with ease.
ReThink is a platform that challenges racial bias and racist behaviour by implementing meaningful friction in user interactions. To promote positive, tolerant behaviour and a switch to System 2 thinking, a modal appears when a user tries to share hateful content.³⁸ This modal states that the content is offensive and asks the user if they still wish to publish it.
ReThink provides a key example of how promoting value-led behaviour can diminish instances of racial biases in users. By simply stating to a user that their behaviour is hateful or racially insensitive, it allows users to identify their racial bias and potentially change their behaviour. Indeed, ReThink found that including friction increased user awareness, while willingness to post offensive content decreased from 71.4% to 4.6%.³⁷ Overall, 93% of users decided not to post offensive content after engaging with the modal.³⁷
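The pause-and-confirm pattern ReThink describes can be sketched as follows. Everything here is an illustrative assumption: the placeholder term list and the simple substring check stand in for ReThink’s real content detection, and the function names are invented for the example.

```python
# Placeholder term list standing in for a real offensive-content
# classifier; ReThink's actual detection is far more sophisticated.
FLAGGED_TERMS = {"offensive-term"}

def looks_offensive(text):
    """Crude stand-in for content classification."""
    return any(term in text.lower() for term in FLAGGED_TERMS)

def try_post(text, user_confirms):
    """Return the published text, or None if the user withdrew it.

    `user_confirms` models the modal: it is invoked only when content
    is flagged, and returns True if the user still wishes to publish.
    """
    if looks_offensive(text) and not user_confirms(text):
        return None  # the user paused, reflected, and chose not to post
    return text
```

The structure matches the statistics quoted above: the friction is only triggered for flagged content, and most users, once prompted, decline to publish.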
Nextdoor
Nextdoor, a social networking service, connects users in local neighbourhoods and contains a network of local bulletin boards. Users can communicate easily, posting content such as missing pet information and community events. However, Nextdoor’s “Crime and Safety” category, which allows users to post about suspicious activity in their community, quickly became a hub for expressions of racial bias.³⁹ Large numbers of posts with racist overtones were shared, and the section was littered with instances of racial profiling. For example, users labeling Black and Latino individuals as ‘suspicious’ for simply walking down the street.¹⁷
The problem was Nextdoor’s reporting system. The process of posting suspicious activity was virtually frictionless, as users simply had to fill in a blank form with a subject line. Because users could write whatever they wanted, racial bias flourished, resulting in reports disproportionately focused on race.¹⁷ According to Sarah Leary, one of Nextdoor’s founders, this lack of friction meant that many users were unconsciously racially profiling.²⁷ In other words, users knew they had seen something that concerned them, and driven by implicit racial bias, made an intuitive decision to post.
To tackle their racial profiling issue, Nextdoor slowed users down. They introduced friction into the process of posting suspicious behaviour by adding steps to their “Crime and Safety” report. For example, the report’s description section now contains multiple specific fields, such as age, gender, and race. If a user enters data into the race field, they are required, at minimum, to fill two additional fields about a subject’s appearance.²⁷ Furthermore, Nextdoor has also added a checklist users must go through before reporting an incident, which explicitly reminds them not to focus solely on race.
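The conditional validation described above can be sketched in a few lines. The field names and the exact rule are assumptions made for illustration, not Nextdoor’s actual schema; the sketch shows only the shape of the friction: race alone is never accepted as a sufficient description.

```python
def validate_report(report):
    """Check a 'Crime and Safety'-style report before it can be posted.

    Returns a list of validation errors; an empty list means the
    report may be submitted.
    """
    errors = []
    if not report.get("description"):
        errors.append("A description of the incident is required.")
    if report.get("race"):
        # If race is mentioned, require at least two further
        # appearance details before the report can be posted.
        appearance_fields = ("age", "gender", "clothing", "hair")
        filled = [f for f in appearance_fields if report.get(f)]
        if len(filled) < 2:
            errors.append(
                "Mentioning race requires at least two further "
                "appearance details (e.g. clothing, hair)."
            )
    return errors
```

Each rejected submission is a moment of friction: the user must stop, gather concrete detail, and in doing so reconsider whether race was ever the relevant observation.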
By introducing meaningful friction into the users’ experience, Nextdoor reduced the incidence of racial profiling on their platform by 75%.¹⁷
Similarly to ReThink, Nextdoor’s design friction stops users before posting racist content and encourages reflective, System 2 thinking. According to Jennifer Eberhardt, who was tasked with addressing Nextdoor’s racism issue, when users have this time to reflect on their decisions, it can lead to more open-minded interactions and critical thought.¹⁷ The report checklist further amplifies this. By explicitly mentioning race, users become more conscious of their behaviour. As a result, this enables them to become more mindful of their own implicit racial biases and impulses.
A Precedent Set
As racist incidents continue to occur across the digital landscape, meaningful friction is being introduced more frequently across the tech industry. Indeed, an increasing number of major platforms appear to be following the precedent set by ReThink and Nextdoor in recognising the value of friction.
In 2018, WhatsApp limited message forwarding in India to suppress misinformation after reports that viral threads had led to riots and lynchings.³⁹
YouTube has toughened its rules on which channels can earn ad revenue⁴⁰ in order to make it more difficult for racist content and extremists to be present on the platform. Perhaps one of the most recent examples is the introduction of restricted replies on Twitter⁴¹, which limits who can reply to a tweet in order to promote thoughtful interactions and remove racist trolls from conversations.
Of course, some platforms are less willing to address racial bias as directly — friction is still seen as a threat to user engagement. Facebook founder Mark Zuckerberg argued that the platform should take a ‘hands-off approach’ to what users post⁴², and Facebook provides limited friction when it comes to user-generated content. This ‘hands-off approach’ included no action being taken against US President Donald Trump’s inflammatory posts surrounding the Black Lives Matter protests.
However, the uptick in meaningful friction is a promising sign that the tech industry is beginning to take more robust, explicit action against racial bias in users.
Conclusion
By examining the impact of design friction on user behaviour and cognitive processes, this research has shown that purposeful, meaningfully designed interruptions can successfully challenge racial bias.
The standardisation of frictionless design across today’s digital landscape stems from a desire to create the most simple, fast and convenient experience possible. The user satisfaction generated by such simplistic interactions drives user engagement and has created a highly competitive product market in which the elimination of friction is seen as a means to overtake competitors. However, there has been little consideration by the tech industry surrounding how this approach impacts users.
Research by social psychologists has shown that frictionless design has allowed racial bias to flourish in users. The lowered cognition required enables implicit racial biases to guide user behaviour, often manifesting as racist microaggressions. Empowered by by-products of frictionless design, including filter bubbles, these implicit racial biases are nurtured and developed to the point that they become explicit. The outcome of this is conscious, racist beliefs that are intentionally acted upon. This is creating an increasingly hostile digital landscape, where polarised, racist behaviour is on the rise.
To combat these racial attitudes, there has been increased support within the design community for the inclusion of meaningful friction. These purposefully-designed friction points elicit reflective behaviour by activating engaged System 2 thinking, enabling racial biases to be suppressed and modified. As a result, users experience positive behaviour change that acts against intuitive racial bias.
From Nextdoor’s plunging instances of racial profiling to ReThink’s impactful prevention of hate posting, the successes of meaningful friction are apparent and have begun to make an impact within the tech industry — with it increasingly being seen as a viable means to challenge hateful behaviour. And as racist incidents perpetuated by users continue to occur, more and more companies are implementing meaningful friction in their interfaces to directly challenge racial bias.
Bibliography
- Young, Victoria. 2015. “Strategic UX: The Art Of Reducing Friction”. Telepathy. https://www.dtelepathy.com/blog/business/strategic-ux-the-art-of-reducing-friction.
- Roose, Kevin. 2018. “Is Tech Too Easy To Use?”. The New York Times. https://www.nytimes.com/2018/12/12/technology/tech-friction-frictionless.html.
- Unknown. 2019. “The Tyranny Of Frictionless Design”. Laptrinhx. https://laptrinhx.com/the-tyranny-of-frictionless-design-494227525/.
- Krug, Steve. 2005. Don’t Make Me Think, Revisited: A Common Sense Approach To Web Usability. 3rd ed. New Riders.
- Cox, Anna L., Sarah Wiseman, and Duncan P. Brumby. 2013. “Designing Devices With The Task In Mind: Which Numbers Are Really Used In Hospitals?”. doi:10.1177/0018720812471988.
- Norman, Don. 2013. The Design Of Everyday Things: Revised And Expanded Edition. 2nd ed. Basic Books.
- Levie, Aaron. 2012. “The Simplicity Thesis”. Fast Company. https://www.fastcompany.com/1835983/simplicity-thesis.
- Cox, Anna L., Marta E. Cecchinato, Sandy Gould, Ioanna Iacovides, and Ian Renfree. 2016. “Design Frictions For Mindful Interactions : The Case For Microboundaries”. CHI’16 Extended Abstracts On Human Factors In Computing Systems. doi:10.1145/2851581.2892410.
- Love, Tessa. 2021. “Airbnb Overtakes Hotel Industry With Red-Hot Growth”. The Business Journals. https://www.bizjournals.com/sanfrancisco/blog/2016/05/airbnbs-growth-overcomes-hotels-rooms.html.
- Mirza, Alexander. 2019. “A New Era Of Lodging: Airbnb’s Impact On Hotels, Travelers, And Cities”. Medium. https://medium.com/harvard-real-estate-review/a-new-era-of-lodging-airbnbs-impact-on-hotels-travelers-and-cities-de3b1c2d5ab6.
- Lemieux, Christiane, and Duff McDonald. 2020. Frictionless. Harper Business.
- Facebook IQ. Undated. “Zero Friction Future”. Facebook. https://scontent.flhr4-1.fna.fbcdn.net/v/t39.8562-6/10000000_2437228363039997_453153967812116480_n.pdf/ZFF_Report_UK.pdf?_nc_cat=104&ccb=2&_nc_sid=ad8a9d&_nc_ohc=uT2uc3MeeCoAX_h_DFx&_nc_ht=scontent.flhr4-1.fna&oh=ebeba71a742342864aa8aeb7fd8b3fed&oe=601A8356.
- Fuciarelli, Megan. 2018. When Implicit Bias Becomes Explicit. Video. TED.
- Apple Podcasts. 2020. “Recode Decode: Psychologist Jennifer Eberhardt”. Podcast. Recode Decode. https://podcasts.apple.com/lt/podcast/psychologist-jennifer-eberhardt-on-how-hidden-biases/id1011668648?i=1000448340404.
- Chan, J., Ghose, A. and Seamans, R. 2016. The Internet and Racial Hate Crime: Offline Spillovers from Online Access. MIS Quarterly, 40(2). https://www.researchgate.net/publication/319132608_The_Internet_and_Racial_Hate_Crimes_Offline_Spillovers_from_Online_Access.
- Funchess, Melanie. 2014. Implicit Bias — How It Affects Us And How We Push Through. Video. TEDx Talks.
- Eberhardt, Jennifer. 2019. Biased. William Heinemann.
- Ruhl, Charlotte. 2020. “Implicit Or Unconscious Bias”. Simply Psychology. https://www.simplypsychology.org/implicit-bias.html.
- Edelman, Benjamin, Michael Luca, and Dan Svirsky. 2017. “Racial Discrimination In The Sharing Economy: Evidence From A Field Experiment”. American Economic Journal: Applied Economics 9 (2). doi:10.1257/app.20160213.
- Charlesworth, Tessa E. S., and Mahzarin R. Banaji. 2019. “Patterns Of Implicit And Explicit Attitudes: I. Long-Term Change And Stability From 2007 To 2016”. doi:10.1177/0956797618813087.
- “Explicit Bias”. 2021. Perception Institute. https://perception.org/research/explicit-bias/.
- Sîrbu, Alina, Dino Pedreschi, Fosca Giannotti, and Janos Kertesz. 2018. “Algorithmic Bias Amplifies Opinion Polarization: A Bounded Confidence Model”. doi:10.1371/journal.pone.0213246.
- “How Filter Bubbles Distort Reality: Everything You Need To Know”. Undated. Blog. Fs. https://fs.blog/2017/07/filter-bubbles/.
- Pariser, Eli. 2011. The Filter Bubble: How The New Personalized Web Is Changing What We Read And How We Think. Penguin Books.
- Meenakshi Sadagopan, Swathi. 2019. “Feedback Loops And Echo Chambers: How Algorithms Amplify Viewpoints”. The Conversation. https://theconversation.com/feedback-loops-and-echo-chambers-how-algorithms-amplify-viewpoints-107935.
- Thi Nguyen, C. 2018. “Echo Chambers And Epistemic Bubbles”. doi:10.1017/epi.2018.32.
- Wachter-Boettcher, Sarah. 2017. Technically Wrong: Sexist Apps, Biased Algorithms, And Other Threats Of Toxic Tech. 1st ed. W. W. Norton & Company.
- Graham, David. 2015. “The Council Of Conservative Citizens: The White-Supremacist Group That Inspired A Racist Manifesto”. The Atlantic. https://www.theatlantic.com/politics/archive/2015/06/council-of-conservative-citizens-dylann-roof/396467/.
- Umoja Noble, Safiya. 2018. Algorithms Of Oppression: How Search Engines Reinforce Racism. NYU Press.
- The website of Dylann Roof has been taken down but can be accessed at: “Dylann Roof’s Journal”. 2019. The Post & Courier. https://www.postandcourier.com/dylann-roofs-journal/pdf_c5f6550c-be72-11e6-b869-7bdf860326f5.html.
- Sweeney, Miriam E. 2013. “Not Just A Pretty (Inter)Face: A Critical Analysis Of Microsoft’s ‘Ms. Dewey’”. Ph.D, University of Illinois.
- Kozlowska, Hanna. 2020. “How Anti-Chinese Sentiment Is Spreading On Social Media”. Quartz. https://qz.com/1823608/how-anti-china-sentiment-is-spreading-on-social-media/.
- Quito, Anne. 2020. “The Push To Redefine “Good Design” Amid The Black Lives Matter Movement”. Quartz. https://qz.com/1869239/how-ux-design-can-counter-racial-bias/.
- Kahneman, Daniel. 2012. Thinking, Fast And Slow. 1st ed. Penguin.
- Hedeen, Laura. 2020. “When Design Friction Is A Good Thing”. Graduate, Iowa State University.
- Yim, Dan. 2021. “Hidden In Plain Sight: Cognitive Bias And Thinking Fast & Slow About Implicit Racial Bias”. Shortreads. https://cct.biola.edu/hidden-plain-sight-cognitive-bias-and-thinking-fast-slow-about-race/.
- Prabhu, Trisha. 2014. Rethink Before You Type. Video. TEDx Talks.
- Thomas, David Dylan. 2020. Cognitive Bias. A Book Apart.
- Hern, Alex. 2018. “WhatsApp To Restrict Message Forwarding After India Mob Lynchings”. The Guardian. https://www.theguardian.com/technology/2018/jul/20/whatsapp-to-limit-message-forwarding-after-india-mob-lynchings.
- Lerman, Rachel. 2019. “YouTube Cracks Down On Racist, Sexist And Similar Insults”. Associated Press. https://apnews.com/article/129be63ab91e148b68ca858a99b1f08c.
- Peters, Jay. 2020. “Twitter Is Testing A Way To Let You Limit Replies To Your Tweets”. The Verge. https://www.theverge.com/2020/5/20/21265090/twitter-testing-limited-replies-tweets-conversations.
- Frenkel, Sheera, Mike Isaac, Cecilia Kang, and Gabriel J.X. Dance. 2020. “Facebook Employees Stage Virtual Walkout To Protest Trump Posts”. The New York Times. https://www.nytimes.com/2020/06/01/technology/facebook-employee-protest-trump.html.